Emotional AI
Affective Computing and Emotional Data: Challenges and Implications in Privacy Regulations, The AI Act, and Ethics in Large Language Models
This paper examines the integration of emotional intelligence into artificial intelligence systems, focusing on affective computing and the growing capability of Large Language Models (LLMs) such as ChatGPT and Claude to recognize and respond to human emotions. Drawing on interdisciplinary research spanning computer science, psychology, and neuroscience, the study analyzes the foundational neural architectures that enable emotion recognition: convolutional neural networks (CNNs) for processing facial expressions and recurrent neural networks (RNNs) for sequential data such as speech and text. It examines the transformation of human emotional experiences into structured emotional data, addressing the distinction between explicit emotional data collected with informed consent in research settings and implicit data gathered passively through everyday digital interactions. This distinction raises critical concerns about lawful processing, AI transparency, and individual autonomy over emotional expressions in digital environments. The paper explores implications across domains including healthcare, education, and customer service, while addressing the challenges posed by cultural variations in emotional expression and potential biases in emotion recognition systems across demographic groups. From a regulatory perspective, the paper examines emotional data in the context of the GDPR and the EU AI Act, highlighting how emotional data may be considered sensitive personal data requiring robust safeguards, including purpose limitation, data minimization, and meaningful consent mechanisms.
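The "structured emotional data" the abstract describes is, in many recognition systems, a probability distribution over a fixed set of emotion labels. A minimal sketch of that final step, assuming a hypothetical seven-class label set and illustrative model scores (the logits, labels, and function names here are assumptions for illustration, not taken from the paper):

```python
import math

# Hypothetical label set; real systems vary in taxonomy and granularity.
EMOTIONS = ["anger", "disgust", "fear", "joy", "sadness", "surprise", "neutral"]

def softmax(logits):
    """Convert raw model scores into a probability distribution."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def to_emotional_record(logits):
    """Structure one prediction as a label -> probability mapping,
    i.e. one row of 'structured emotional data' per frame or utterance."""
    probs = softmax(logits)
    return dict(zip(EMOTIONS, probs))

# Illustrative scores: a model that leans toward "anger" for this input.
record = to_emotional_record([2.1, -1.0, 0.3, 0.5, -0.2, 0.0, 1.4])
top = max(record, key=record.get)  # most probable emotion label
```

Once emotions are flattened into records like this, they can be stored, aggregated, and profiled like any other personal data, which is precisely what triggers the GDPR and AI Act concerns the abstract raises.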
Feeling Machines: Ethics, Culture, and the Rise of Emotional AI
Vivek Chavan, Arsen Cenaj, Shuyuan Shen, Ariane Bar, Srishti Binwani, Tommaso Del Becaro, Marius Funk, Lynn Greschner, Roberto Hung, Stina Klein, Romina Kleiner, Stefanie Krause, Sylwia Olbrych, Vishvapalsinhji Parmar, Jaleh Sarafraz, Daria Soroko, Daksitha Withanage Don, Chang Zhou, Hoang Thuy Duong Vu, Parastoo Semnani, Daniel Weinhardt, Elisabeth Andre, Jörg Krüger, Xavier Fresquet
This paper explores the growing presence of emotionally responsive artificial intelligence through a critical and interdisciplinary lens. Bringing together the voices of early-career researchers from multiple fields, it examines how AI systems that simulate or interpret human emotions are reshaping our interactions in areas such as education, healthcare, mental health, caregiving, and digital life. The analysis is structured around four central themes: the ethical implications of emotional AI, the cultural dynamics of human-machine interaction, the risks and opportunities for vulnerable populations, and the emerging regulatory, design, and technical considerations. The authors highlight the potential of affective AI to support mental well-being, enhance learning, and reduce loneliness, as well as the risks of emotional manipulation, over-reliance, misrepresentation, and cultural bias. Key challenges include simulating empathy without genuine understanding, encoding dominant sociocultural norms into AI systems, and insufficient safeguards for individuals in sensitive or high-risk contexts. Special attention is given to children, elderly users, and individuals with mental health challenges, who may interact with AI in emotionally significant ways yet often lack the cognitive or legal protections needed to navigate such engagements safely. The report concludes with ten recommendations, including the need for transparency, certification frameworks, region-specific fine-tuning, human oversight, and longitudinal research. A curated supplementary section provides practical tools, models, and datasets to support further work in this domain.
Are you 80% angry and 2% sad? Why 'emotional AI' is fraught with problems
It's Wednesday evening and I'm at my kitchen table, scowling into my laptop as I pour all the bile I can muster into three little words: "I love you." My neighbours might assume I'm engaged in a melodramatic call to an ex-partner, or perhaps some kind of acting exercise, but I'm actually testing the limits of a new demo from Hume, a Manhattan-based startup that claims to have developed "the world's first voice AI with emotional intelligence". "We train a large language model that also understands your tone of voice," says Hume's CEO and chief scientist Alan Cowen. "What that enables… is to be able to predict how a given speech utterance or sentence will evoke patterns of emotion." In other words, Hume claims to recognise the emotion in our voices (and in another, non-public version, facial expressions) and respond empathically.
Emotional AI Is No Substitute for Empathy
In 2023, emotional AI, technology that can sense and interact with human emotions, will become one of the dominant applications of machine learning. For instance, Hume AI, founded by Alan Cowen, a former Google researcher, is developing tools to measure emotions from verbal, facial, and vocal expressions. Swedish company Smart Eyes recently acquired Affectiva, the MIT Media Lab spinoff that developed the SoundNet neural network, an algorithm that classifies emotions such as anger from audio samples in less than 1.2 seconds. Even the video platform Zoom is introducing Zoom IQ, a feature that will soon provide users with real-time analysis of emotions and engagement during a virtual meeting. In 2023, tech companies will be releasing advanced chatbots that can closely mimic human emotions to create more empathetic connections with users across banking, education, and health care.
La veille de la cybersécurité
A recent study from Ritsumeikan Asia Pacific University in Japan examines the sociocultural factors that influence Generation Z's acceptance of emotional AI technology. The researchers argue that studying emotional AI's acceptance among Gen Z is essential because they are the generation most susceptible to it. Over 50% of respondents expressed anxiety about non-conscious data collection (NCDC) overall, although responses varied according to gender, income level, educational level, and religious affiliation. Emotional AI, or artificial intelligence that engages with human emotions, is quickly developing to be useful in many applications.
New Study Observes Acceptance of Emotional AI Among Gen Z
A new study out of Ritsumeikan Asia Pacific University in Japan observes the socio-cultural factors that influence the acceptance of AI technology among Generation Z. Emotional AI, which is artificial intelligence that engages human emotions, is quickly growing and being used in a wide range of applications. With that said, it is fairly unregulated at […]
Emotional AI and Gen Z: The attitude towards new technology and its concerns
AI has a ubiquitous presence in technology. Yet it has long lacked a crucial feature: the ability to engage human emotions. Algorithms that can sense human emotions and interact with them are quickly becoming mainstream as they come embedded in existing systems. Known as "emotional AI," the new technology achieves this feat through a process called "non-conscious data collection" (NCDC), in which the algorithm collects data on the user's heart and respiration rate, voice tones, micro-facial expressions, gestures, etc. to analyze their moods and personalize its response accordingly. However, the unregulated nature of this technology has raised many ethical and privacy concerns.
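The NCDC process described above amounts to a signal-fusion step: passively sampled physiological and vocal readings are normalized and combined into a mood estimate. A minimal sketch, where the signal ranges, weights, and function names are purely illustrative assumptions and not drawn from any deployed system:

```python
def normalize(value, low, high):
    """Clamp a raw reading into [0, 1] relative to an assumed typical range."""
    return max(0.0, min(1.0, (value - low) / (high - low)))

def arousal_score(heart_rate_bpm, respiration_rate_bpm, voice_pitch_hz):
    """Fuse passively collected signals into a single arousal estimate in [0, 1].
    Ranges and weights below are hypothetical, chosen only for illustration."""
    features = [
        normalize(heart_rate_bpm, 60, 120),       # resting -> elevated heart rate
        normalize(respiration_rate_bpm, 12, 25),  # calm -> rapid breathing
        normalize(voice_pitch_hz, 100, 300),      # low -> raised vocal pitch
    ]
    weights = [0.5, 0.2, 0.3]                     # illustrative weighting
    return sum(w * f for w, f in zip(weights, features))

calm = arousal_score(62, 13, 110)       # readings near the resting ranges
agitated = arousal_score(110, 22, 260)  # elevated readings across all signals
```

The privacy concern follows directly from the sketch: none of these inputs requires the user's awareness, yet their fusion yields an inference about inner state.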
Emotion recognition AI finding fans among lawyers swaying juries and potential clients
The American Bar Association has taken greater notice of emotional AI as a tool for honing courtroom and marketing performance. It is not clear if the storied group has caught up with the controversy that follows the comparatively new field. On the association's May 18 Legal Rebels podcast, ABA Journal legal affairs writer Victor Li speaks with the CEO of software startup EmotionTrac (a subsidiary of mobile ad tech firm Jinglz) about how an app first designed for the advertising industry reportedly has been adopted by dozens of attorneys. Aaron Itzkowitz is at pains to make clear the difference between facial recognition and affect recognition. At the moment, the use of face biometrics by governments is a growing controversy, and Li would like to stay separate from that debate.
Emotional AI and other 'moonshot' technologies could grow to $6 trillion market by 2030, says Bank of America
"The pace at which themes are transforming businesses is blistering, but the adoption of many technologies, like smartphones or renewable energy, have surpassed experts' forecasts by decades, because we often think linearly but progress occurs exponentially," say the strategists. They say a paradigm shift in the explosion of data, faster processing power and the rise of artificial intelligence will bring about the "fastest rollout of disruptive tech in history." And in the big stock universe, an increasing few are showing investors the money. "Over the past 30 years, just 1.5% of companies generated all the net wealth on the global stock market, meaning that actually only a handful of disrupters ("superstar firms") really influence long-term financial markets," say Israel and the team. Here are the 14 technologies: 6G, brain computer interfacing (BCI), emotional artificial intelligence, synthetic biology, immortality, bionic humans, eVTOL (electrical vertical takeoff and landing vehicles), wireless electricity, holograms, metaverse, next-gen batteries, oceantech (ocean energy, precision fishing, etc.), green mining and CCS (negative-emissions technology that captures and stores carbon dioxide before it can be released).